Console Output

Training and evaluating model for: Fridge
Dataset length: 20362 windows
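The window count above suggests the mains series was cut into fixed-length sliding windows before training. A minimal sketch of that preprocessing step, assuming a hypothetical window length and stride (the log only reports the resulting count):

```python
import numpy as np

def make_windows(signal: np.ndarray, window: int, stride: int) -> np.ndarray:
    """Cut a 1-D signal into overlapping windows; returns shape (num_windows, window)."""
    n = (len(signal) - window) // stride + 1
    return np.stack([signal[i * stride : i * stride + window] for i in range(n)])

# Toy example: 100 samples, window of 10, stride of 5
series = np.arange(100, dtype=np.float32)
w = make_windows(series, window=10, stride=5)
print(w.shape)  # (19, 10)
```

With the real multi-channel input (9 features per time step, per the model below), the same windowing would be applied along the time axis of a (time, channels) array.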


NILMModel(
  (conv1d): Conv1d(9, 9, kernel_size=(3,), stride=(1,), padding=(1,))
  (lstm): LSTM(9, 128, num_layers=3, batch_first=True, dropout=0.1)
  (dropout): Dropout(p=0.1, inplace=False)
  (relu): ReLU()
  (output_layer): Linear(in_features=128, out_features=1, bias=True)
)
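The printed repr corresponds to a Conv1d front end feeding a 3-layer LSTM and a per-step linear head. A sketch of a module that reproduces this architecture; the `forward` wiring (conv → ReLU → LSTM → dropout → linear, with the transpose for Conv1d's channel-first convention) is an assumption, since the repr only lists the submodules:

```python
import torch
import torch.nn as nn

class NILMModel(nn.Module):
    """Conv1d(9->9, k=3) -> LSTM(9->128, 3 layers) -> Linear(128->1), per the repr above."""
    def __init__(self, in_channels: int = 9, hidden: int = 128):
        super().__init__()
        self.conv1d = nn.Conv1d(in_channels, in_channels, kernel_size=3, stride=1, padding=1)
        self.lstm = nn.LSTM(in_channels, hidden, num_layers=3, batch_first=True, dropout=0.1)
        self.dropout = nn.Dropout(p=0.1)
        self.relu = nn.ReLU()
        self.output_layer = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, channels); Conv1d expects (batch, channels, seq_len)
        y = self.relu(self.conv1d(x.transpose(1, 2))).transpose(1, 2)
        y, _ = self.lstm(y)          # (batch, seq_len, hidden)
        y = self.dropout(y)
        return self.output_layer(y)  # (batch, seq_len, 1): one power estimate per step

model = NILMModel()
out = model(torch.randn(2, 60, 9))
print(out.shape)  # torch.Size([2, 60, 1])
```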
Epoch [1/300], Train Loss: 0.000708
Validation Loss: 0.000653
Epoch [2/300], Train Loss: 0.000666
Validation Loss: 0.000640
Epoch [3/300], Train Loss: 0.000634
Validation Loss: 0.000610
Epoch [4/300], Train Loss: 0.000611
Validation Loss: 0.000603
Epoch [5/300], Train Loss: 0.000607
Validation Loss: 0.000601
Epoch [6/300], Train Loss: 0.000605
Validation Loss: 0.000600
Epoch [7/300], Train Loss: 0.000604
Validation Loss: 0.000600
Epoch [8/300], Train Loss: 0.000602
Validation Loss: 0.000597
Epoch [9/300], Train Loss: 0.000600
Validation Loss: 0.000594
Epoch [10/300], Train Loss: 0.000599
Validation Loss: 0.000593
Epoch [11/300], Train Loss: 0.000596
Validation Loss: 0.000598
Epoch [12/300], Train Loss: 0.000595
Validation Loss: 0.000588
Epoch [13/300], Train Loss: 0.000593
Validation Loss: 0.000586
Epoch [14/300], Train Loss: 0.000592
Validation Loss: 0.000588
Epoch [15/300], Train Loss: 0.000590
Validation Loss: 0.000581
Epoch [16/300], Train Loss: 0.000588
Validation Loss: 0.000580
Epoch [17/300], Train Loss: 0.000586
Validation Loss: 0.000578
Epoch [18/300], Train Loss: 0.000584
Validation Loss: 0.000576
Epoch [19/300], Train Loss: 0.000582
Validation Loss: 0.000573
Epoch [20/300], Train Loss: 0.000580
Validation Loss: 0.000570
Epoch [21/300], Train Loss: 0.000578
Validation Loss: 0.000566
Epoch [22/300], Train Loss: 0.000575
Validation Loss: 0.000566
Epoch [23/300], Train Loss: 0.000571
Validation Loss: 0.000558
Epoch [24/300], Train Loss: 0.000568
Validation Loss: 0.000554
Epoch [25/300], Train Loss: 0.000562
Validation Loss: 0.000552
Epoch [26/300], Train Loss: 0.000559
Validation Loss: 0.000545
Epoch [27/300], Train Loss: 0.000555
Validation Loss: 0.000544
Epoch [28/300], Train Loss: 0.000551
Validation Loss: 0.000538
Epoch [29/300], Train Loss: 0.000547
Validation Loss: 0.000533
Epoch [30/300], Train Loss: 0.000543
Validation Loss: 0.000525
Epoch [31/300], Train Loss: 0.000537
Validation Loss: 0.000520
Epoch [32/300], Train Loss: 0.000531
Validation Loss: 0.000515
Epoch [33/300], Train Loss: 0.000525
Validation Loss: 0.000509
Epoch [34/300], Train Loss: 0.000517
Validation Loss: 0.000501
Epoch [35/300], Train Loss: 0.000511
Validation Loss: 0.000499
Epoch [36/300], Train Loss: 0.000504
Validation Loss: 0.000487
Epoch [37/300], Train Loss: 0.000498
Validation Loss: 0.000482
Epoch [38/300], Train Loss: 0.000491
Validation Loss: 0.000478
Epoch [39/300], Train Loss: 0.000490
Validation Loss: 0.000481
Epoch [40/300], Train Loss: 0.000484
Validation Loss: 0.000471
Epoch [41/300], Train Loss: 0.000479
Validation Loss: 0.000479
Epoch [42/300], Train Loss: 0.000480
Validation Loss: 0.000470
Epoch [43/300], Train Loss: 0.000476
Validation Loss: 0.000466
Epoch [44/300], Train Loss: 0.000473
Validation Loss: 0.000463
Epoch [45/300], Train Loss: 0.000469
Validation Loss: 0.000456
Epoch [46/300], Train Loss: 0.000464
Validation Loss: 0.000455
Epoch [47/300], Train Loss: 0.000463
Validation Loss: 0.000453
Epoch [48/300], Train Loss: 0.000457
Validation Loss: 0.000446
Epoch [49/300], Train Loss: 0.000454
Validation Loss: 0.000446
Epoch [50/300], Train Loss: 0.000452
Validation Loss: 0.000440
Epoch [51/300], Train Loss: 0.000448
Validation Loss: 0.000447
Epoch [52/300], Train Loss: 0.000445
Validation Loss: 0.000442
Epoch [53/300], Train Loss: 0.000443
Validation Loss: 0.000451
Epoch [54/300], Train Loss: 0.000441
Validation Loss: 0.000445
Epoch [55/300], Train Loss: 0.000440
Validation Loss: 0.000430
Epoch [56/300], Train Loss: 0.000434
Validation Loss: 0.000426
Epoch [57/300], Train Loss: 0.000432
Validation Loss: 0.000423
Epoch [58/300], Train Loss: 0.000431
Validation Loss: 0.000419
Epoch [59/300], Train Loss: 0.000428
Validation Loss: 0.000418
Epoch [60/300], Train Loss: 0.000424
Validation Loss: 0.000413
Epoch [61/300], Train Loss: 0.000423
Validation Loss: 0.000408
Epoch [62/300], Train Loss: 0.000419
Validation Loss: 0.000408
Epoch [63/300], Train Loss: 0.000418
Validation Loss: 0.000411
Epoch [64/300], Train Loss: 0.000417
Validation Loss: 0.000409
Epoch [65/300], Train Loss: 0.000415
Validation Loss: 0.000409
Epoch [66/300], Train Loss: 0.000412
Validation Loss: 0.000403
Epoch [67/300], Train Loss: 0.000412
Validation Loss: 0.000395
Epoch [68/300], Train Loss: 0.000408
Validation Loss: 0.000393
Epoch [69/300], Train Loss: 0.000407
Validation Loss: 0.000403
Epoch [70/300], Train Loss: 0.000407
Validation Loss: 0.000389
Epoch [71/300], Train Loss: 0.000404
Validation Loss: 0.000389
Epoch [72/300], Train Loss: 0.000402
Validation Loss: 0.000390
Epoch [73/300], Train Loss: 0.000400
Validation Loss: 0.000385
Epoch [74/300], Train Loss: 0.000398
Validation Loss: 0.000386
Epoch [75/300], Train Loss: 0.000398
Validation Loss: 0.000387
Epoch [76/300], Train Loss: 0.000396
Validation Loss: 0.000388
Epoch [77/300], Train Loss: 0.000395
Validation Loss: 0.000385
Epoch [78/300], Train Loss: 0.000394
Validation Loss: 0.000384
Epoch [79/300], Train Loss: 0.000391
Validation Loss: 0.000380
Epoch [80/300], Train Loss: 0.000390
Validation Loss: 0.000379
Epoch [81/300], Train Loss: 0.000390
Validation Loss: 0.000375
Epoch [82/300], Train Loss: 0.000388
Validation Loss: 0.000381
Epoch [83/300], Train Loss: 0.000386
Validation Loss: 0.000373
Epoch [84/300], Train Loss: 0.000386
Validation Loss: 0.000372
Epoch [85/300], Train Loss: 0.000385
Validation Loss: 0.000369
Epoch [86/300], Train Loss: 0.000382
Validation Loss: 0.000368
Epoch [87/300], Train Loss: 0.000383
Validation Loss: 0.000366
Epoch [88/300], Train Loss: 0.000381
Validation Loss: 0.000374
Epoch [89/300], Train Loss: 0.000379
Validation Loss: 0.000363
Epoch [90/300], Train Loss: 0.000378
Validation Loss: 0.000360
Epoch [91/300], Train Loss: 0.000375
Validation Loss: 0.000369
Epoch [92/300], Train Loss: 0.000374
Validation Loss: 0.000358
Epoch [93/300], Train Loss: 0.000372
Validation Loss: 0.000356
Epoch [94/300], Train Loss: 0.000372
Validation Loss: 0.000356
Epoch [95/300], Train Loss: 0.000370
Validation Loss: 0.000356
Epoch [96/300], Train Loss: 0.000368
Validation Loss: 0.000355
Epoch [97/300], Train Loss: 0.000368
Validation Loss: 0.000357
Epoch [98/300], Train Loss: 0.000366
Validation Loss: 0.000350
Epoch [99/300], Train Loss: 0.000364
Validation Loss: 0.000353
Epoch [100/300], Train Loss: 0.000363
Validation Loss: 0.000351
Epoch [101/300], Train Loss: 0.000363
Validation Loss: 0.000348
Epoch [102/300], Train Loss: 0.000360
Validation Loss: 0.000343
Epoch [103/300], Train Loss: 0.000359
Validation Loss: 0.000346
Epoch [104/300], Train Loss: 0.000359
Validation Loss: 0.000343
Epoch [105/300], Train Loss: 0.000356
Validation Loss: 0.000344
Epoch [106/300], Train Loss: 0.000356
Validation Loss: 0.000341
Epoch [107/300], Train Loss: 0.000353
Validation Loss: 0.000337
Epoch [108/300], Train Loss: 0.000356
Validation Loss: 0.000348
Epoch [109/300], Train Loss: 0.000353
Validation Loss: 0.000349
Epoch [110/300], Train Loss: 0.000350
Validation Loss: 0.000334
Epoch [111/300], Train Loss: 0.000349
Validation Loss: 0.000335
Epoch [112/300], Train Loss: 0.000347
Validation Loss: 0.000340
Epoch [113/300], Train Loss: 0.000349
Validation Loss: 0.000334
Epoch [114/300], Train Loss: 0.000347
Validation Loss: 0.000338
Epoch [115/300], Train Loss: 0.000346
Validation Loss: 0.000329
Epoch [116/300], Train Loss: 0.000344
Validation Loss: 0.000329
Epoch [117/300], Train Loss: 0.000345
Validation Loss: 0.000338
Epoch [118/300], Train Loss: 0.000344
Validation Loss: 0.000327
Epoch [119/300], Train Loss: 0.000342
Validation Loss: 0.000332
Epoch [120/300], Train Loss: 0.000341
Validation Loss: 0.000327
Epoch [121/300], Train Loss: 0.000340
Validation Loss: 0.000331
Epoch [122/300], Train Loss: 0.000339
Validation Loss: 0.000324
Epoch [123/300], Train Loss: 0.000338
Validation Loss: 0.000325
Epoch [124/300], Train Loss: 0.000339
Validation Loss: 0.000324
Epoch [125/300], Train Loss: 0.000339
Validation Loss: 0.000326
Epoch [126/300], Train Loss: 0.000338
Validation Loss: 0.000323
Epoch [127/300], Train Loss: 0.000336
Validation Loss: 0.000323
Epoch [128/300], Train Loss: 0.000336
Validation Loss: 0.000323
Epoch [129/300], Train Loss: 0.000336
Validation Loss: 0.000321
Epoch [130/300], Train Loss: 0.000335
Validation Loss: 0.000321
Epoch [131/300], Train Loss: 0.000333
Validation Loss: 0.000318
Epoch [132/300], Train Loss: 0.000333
Validation Loss: 0.000325
Epoch [133/300], Train Loss: 0.000332
Validation Loss: 0.000320
Epoch [134/300], Train Loss: 0.000331
Validation Loss: 0.000319
Epoch [135/300], Train Loss: 0.000332
Validation Loss: 0.000317
Epoch [136/300], Train Loss: 0.000330
Validation Loss: 0.000318
Epoch [137/300], Train Loss: 0.000329
Validation Loss: 0.000316
Epoch [138/300], Train Loss: 0.000329
Validation Loss: 0.000316
Epoch [139/300], Train Loss: 0.000329
Validation Loss: 0.000314
Epoch [140/300], Train Loss: 0.000329
Validation Loss: 0.000316
Epoch [141/300], Train Loss: 0.000327
Validation Loss: 0.000313
Epoch [142/300], Train Loss: 0.000327
Validation Loss: 0.000316
Epoch [143/300], Train Loss: 0.000327
Validation Loss: 0.000316
Epoch [144/300], Train Loss: 0.000326
Validation Loss: 0.000314
Epoch [145/300], Train Loss: 0.000325
Validation Loss: 0.000314
Epoch [146/300], Train Loss: 0.000324
Validation Loss: 0.000310
Epoch [147/300], Train Loss: 0.000325
Validation Loss: 0.000310
Epoch [148/300], Train Loss: 0.000324
Validation Loss: 0.000312
Epoch [149/300], Train Loss: 0.000323
Validation Loss: 0.000309
Epoch [150/300], Train Loss: 0.000323
Validation Loss: 0.000310
Epoch [151/300], Train Loss: 0.000321
Validation Loss: 0.000308
Epoch [152/300], Train Loss: 0.000320
Validation Loss: 0.000307
Epoch [153/300], Train Loss: 0.000320
Validation Loss: 0.000315
Epoch [154/300], Train Loss: 0.000320
Validation Loss: 0.000311
Epoch [155/300], Train Loss: 0.000320
Validation Loss: 0.000307
Epoch [156/300], Train Loss: 0.000319
Validation Loss: 0.000310
Epoch [157/300], Train Loss: 0.000319
Validation Loss: 0.000307
Epoch [158/300], Train Loss: 0.000318
Validation Loss: 0.000304
Epoch [159/300], Train Loss: 0.000317
Validation Loss: 0.000307
Epoch [160/300], Train Loss: 0.000318
Validation Loss: 0.000307
Epoch [161/300], Train Loss: 0.000317
Validation Loss: 0.000304
Epoch [162/300], Train Loss: 0.000317
Validation Loss: 0.000304
Epoch [163/300], Train Loss: 0.000315
Validation Loss: 0.000305
Epoch [164/300], Train Loss: 0.000315
Validation Loss: 0.000305
Epoch [165/300], Train Loss: 0.000315
Validation Loss: 0.000302
Epoch [166/300], Train Loss: 0.000314
Validation Loss: 0.000301
Epoch [167/300], Train Loss: 0.000314
Validation Loss: 0.000303
Epoch [168/300], Train Loss: 0.000314
Validation Loss: 0.000303
Epoch [169/300], Train Loss: 0.000314
Validation Loss: 0.000300
Epoch [170/300], Train Loss: 0.000313
Validation Loss: 0.000305
Epoch [171/300], Train Loss: 0.000312
Validation Loss: 0.000300
Epoch [172/300], Train Loss: 0.000311
Validation Loss: 0.000301
Epoch [173/300], Train Loss: 0.000311
Validation Loss: 0.000298
Epoch [174/300], Train Loss: 0.000311
Validation Loss: 0.000300
Epoch [175/300], Train Loss: 0.000311
Validation Loss: 0.000310
Epoch [176/300], Train Loss: 0.000310
Validation Loss: 0.000298
Epoch [177/300], Train Loss: 0.000310
Validation Loss: 0.000301
Epoch [178/300], Train Loss: 0.000310
Validation Loss: 0.000299
Epoch [179/300], Train Loss: 0.000308
Validation Loss: 0.000301
Epoch [180/300], Train Loss: 0.000308
Validation Loss: 0.000298
Epoch [181/300], Train Loss: 0.000309
Validation Loss: 0.000299
Epoch [182/300], Train Loss: 0.000307
Validation Loss: 0.000295
Epoch [183/300], Train Loss: 0.000306
Validation Loss: 0.000295
Epoch [184/300], Train Loss: 0.000306
Validation Loss: 0.000294
Epoch [185/300], Train Loss: 0.000305
Validation Loss: 0.000295
Epoch [186/300], Train Loss: 0.000305
Validation Loss: 0.000295
Epoch [187/300], Train Loss: 0.000305
Validation Loss: 0.000293
Epoch [188/300], Train Loss: 0.000305
Validation Loss: 0.000294
Epoch [189/300], Train Loss: 0.000305
Validation Loss: 0.000294
Epoch [190/300], Train Loss: 0.000305
Validation Loss: 0.000299
Epoch [191/300], Train Loss: 0.000304
Validation Loss: 0.000296
Epoch [192/300], Train Loss: 0.000304
Validation Loss: 0.000293
Epoch [193/300], Train Loss: 0.000303
Validation Loss: 0.000291
Epoch [194/300], Train Loss: 0.000303
Validation Loss: 0.000291
Epoch [195/300], Train Loss: 0.000302
Validation Loss: 0.000292
Epoch [196/300], Train Loss: 0.000302
Validation Loss: 0.000292
Epoch [197/300], Train Loss: 0.000301
Validation Loss: 0.000289
Epoch [198/300], Train Loss: 0.000302
Validation Loss: 0.000293
Epoch [199/300], Train Loss: 0.000301
Validation Loss: 0.000293
Epoch [200/300], Train Loss: 0.000301
Validation Loss: 0.000289
Epoch [201/300], Train Loss: 0.000301
Validation Loss: 0.000294
Epoch [202/300], Train Loss: 0.000300
Validation Loss: 0.000289
Epoch [203/300], Train Loss: 0.000299
Validation Loss: 0.000290
Epoch [204/300], Train Loss: 0.000299
Validation Loss: 0.000294
Epoch [205/300], Train Loss: 0.000298
Validation Loss: 0.000287
Epoch [206/300], Train Loss: 0.000298
Validation Loss: 0.000288
Epoch [207/300], Train Loss: 0.000298
Validation Loss: 0.000288
Epoch [208/300], Train Loss: 0.000297
Validation Loss: 0.000288
Epoch [209/300], Train Loss: 0.000296
Validation Loss: 0.000288
Epoch [210/300], Train Loss: 0.000297
Validation Loss: 0.000289
Epoch [211/300], Train Loss: 0.000296
Validation Loss: 0.000285
Epoch [212/300], Train Loss: 0.000294
Validation Loss: 0.000285
Epoch [213/300], Train Loss: 0.000294
Validation Loss: 0.000287
Epoch [214/300], Train Loss: 0.000294
Validation Loss: 0.000284
Epoch [215/300], Train Loss: 0.000293
Validation Loss: 0.000281
Epoch [216/300], Train Loss: 0.000291
Validation Loss: 0.000280
Epoch [217/300], Train Loss: 0.000293
Validation Loss: 0.000280
Epoch [218/300], Train Loss: 0.000291
Validation Loss: 0.000282
Epoch [219/300], Train Loss: 0.000291
Validation Loss: 0.000279
Epoch [220/300], Train Loss: 0.000290
Validation Loss: 0.000280
Epoch [221/300], Train Loss: 0.000290
Validation Loss: 0.000278
Epoch [222/300], Train Loss: 0.000290
Validation Loss: 0.000280
Epoch [223/300], Train Loss: 0.000289
Validation Loss: 0.000279
Epoch [224/300], Train Loss: 0.000289
Validation Loss: 0.000277
Epoch [225/300], Train Loss: 0.000288
Validation Loss: 0.000276
Epoch [226/300], Train Loss: 0.000287
Validation Loss: 0.000277
Epoch [227/300], Train Loss: 0.000286
Validation Loss: 0.000277
Epoch [228/300], Train Loss: 0.000285
Validation Loss: 0.000275
Epoch [229/300], Train Loss: 0.000285
Validation Loss: 0.000274
Epoch [230/300], Train Loss: 0.000284
Validation Loss: 0.000275
Epoch [231/300], Train Loss: 0.000284
Validation Loss: 0.000289
Epoch [232/300], Train Loss: 0.000284
Validation Loss: 0.000274
Epoch [233/300], Train Loss: 0.000283
Validation Loss: 0.000275
Epoch [234/300], Train Loss: 0.000284
Validation Loss: 0.000275
Epoch [235/300], Train Loss: 0.000283
Validation Loss: 0.000276
Epoch [236/300], Train Loss: 0.000283
Validation Loss: 0.000272
Epoch [237/300], Train Loss: 0.000284
Validation Loss: 0.000271
Epoch [238/300], Train Loss: 0.000281
Validation Loss: 0.000270
Epoch [239/300], Train Loss: 0.000281
Validation Loss: 0.000273
Epoch [240/300], Train Loss: 0.000281
Validation Loss: 0.000271
Epoch [241/300], Train Loss: 0.000280
Validation Loss: 0.000270
Epoch [242/300], Train Loss: 0.000280
Validation Loss: 0.000271
Epoch [243/300], Train Loss: 0.000280
Validation Loss: 0.000272
Epoch [244/300], Train Loss: 0.000279
Validation Loss: 0.000268
Epoch [245/300], Train Loss: 0.000279
Validation Loss: 0.000270
Epoch [246/300], Train Loss: 0.000279
Validation Loss: 0.000269
Epoch [247/300], Train Loss: 0.000279
Validation Loss: 0.000268
Epoch [248/300], Train Loss: 0.000278
Validation Loss: 0.000267
Epoch [249/300], Train Loss: 0.000278
Validation Loss: 0.000269
Epoch [250/300], Train Loss: 0.000277
Validation Loss: 0.000268
Epoch [251/300], Train Loss: 0.000277
Validation Loss: 0.000268
Epoch [252/300], Train Loss: 0.000278
Validation Loss: 0.000268
Epoch [253/300], Train Loss: 0.000276
Validation Loss: 0.000266
Epoch [254/300], Train Loss: 0.000276
Validation Loss: 0.000268
Epoch [255/300], Train Loss: 0.000276
Validation Loss: 0.000265
Epoch [256/300], Train Loss: 0.000276
Validation Loss: 0.000268
Epoch [257/300], Train Loss: 0.000275
Validation Loss: 0.000264
Epoch [258/300], Train Loss: 0.000275
Validation Loss: 0.000265
Epoch [259/300], Train Loss: 0.000275
Validation Loss: 0.000265
Epoch [260/300], Train Loss: 0.000274
Validation Loss: 0.000271
Epoch [261/300], Train Loss: 0.000275
Validation Loss: 0.000265
Epoch [262/300], Train Loss: 0.000274
Validation Loss: 0.000266
Epoch [263/300], Train Loss: 0.000274
Validation Loss: 0.000265
Epoch [264/300], Train Loss: 0.000274
Validation Loss: 0.000264
Epoch [265/300], Train Loss: 0.000273
Validation Loss: 0.000264
Epoch [266/300], Train Loss: 0.000273
Validation Loss: 0.000265
Epoch [267/300], Train Loss: 0.000273
Validation Loss: 0.000265
Epoch [268/300], Train Loss: 0.000274
Validation Loss: 0.000264
Epoch [269/300], Train Loss: 0.000273
Validation Loss: 0.000264
Epoch [270/300], Train Loss: 0.000273
Validation Loss: 0.000263
Epoch [271/300], Train Loss: 0.000272
Validation Loss: 0.000262
Epoch [272/300], Train Loss: 0.000271
Validation Loss: 0.000261
Epoch [273/300], Train Loss: 0.000272
Validation Loss: 0.000262
Epoch [274/300], Train Loss: 0.000271
Validation Loss: 0.000263
Epoch [275/300], Train Loss: 0.000272
Validation Loss: 0.000261
Epoch [276/300], Train Loss: 0.000271
Validation Loss: 0.000261
Epoch [277/300], Train Loss: 0.000271
Validation Loss: 0.000262
Epoch [278/300], Train Loss: 0.000271
Validation Loss: 0.000261
Epoch [279/300], Train Loss: 0.000271
Validation Loss: 0.000261
Epoch [280/300], Train Loss: 0.000270
Validation Loss: 0.000262
Epoch [281/300], Train Loss: 0.000270
Validation Loss: 0.000263
Epoch [282/300], Train Loss: 0.000271
Validation Loss: 0.000260
Epoch [283/300], Train Loss: 0.000270
Validation Loss: 0.000262
Epoch [284/300], Train Loss: 0.000269
Validation Loss: 0.000263
Epoch [285/300], Train Loss: 0.000270
Validation Loss: 0.000262
Epoch [286/300], Train Loss: 0.000272
Validation Loss: 0.000259
Epoch [287/300], Train Loss: 0.000269
Validation Loss: 0.000259
Epoch [288/300], Train Loss: 0.000268
Validation Loss: 0.000258
Epoch [289/300], Train Loss: 0.000268
Validation Loss: 0.000259
Epoch [290/300], Train Loss: 0.000269
Validation Loss: 0.000263
Epoch [291/300], Train Loss: 0.000269
Validation Loss: 0.000260
Epoch [292/300], Train Loss: 0.000269
Validation Loss: 0.000259
Epoch [293/300], Train Loss: 0.000268
Validation Loss: 0.000260
Epoch [294/300], Train Loss: 0.000268
Validation Loss: 0.000258
Epoch [295/300], Train Loss: 0.000267
Validation Loss: 0.000259
Epoch [296/300], Train Loss: 0.000267
Validation Loss: 0.000257
Epoch [297/300], Train Loss: 0.000267
Validation Loss: 0.000258
Epoch [298/300], Train Loss: 0.000267
Validation Loss: 0.000257
Epoch [299/300], Train Loss: 0.000267
Validation Loss: 0.000257
Epoch [300/300], Train Loss: 0.000268
Validation Loss: 0.000257
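The per-epoch lines above are the output of a standard train/validate loop. A hedged reconstruction, assuming MSE loss, an Adam optimizer, and hypothetical `train_loader`/`val_loader` names; none of these choices are confirmed by the log itself:

```python
import torch
import torch.nn as nn

def run_training(model, train_loader, val_loader, num_epochs=300, lr=1e-3):
    """Train with MSE and print losses in the same format as the log above."""
    criterion = nn.MSELoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for epoch in range(num_epochs):
        model.train()
        train_loss = 0.0
        for x, y in train_loader:
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            optimizer.step()
            train_loss += loss.item()
        print(f"Epoch [{epoch + 1}/{num_epochs}], Train Loss: {train_loss / len(train_loader):.6f}")

        model.eval()
        val_loss = 0.0
        with torch.no_grad():  # no gradients needed for validation
            for x, y in val_loader:
                val_loss += criterion(model(x), y).item()
        print(f"Validation Loss: {val_loss / len(val_loader):.6f}")
```

Note that validation loss tracks training loss closely for all 300 epochs and is still (slowly) improving at the end, so the run shows no sign of overfitting; early stopping would not have triggered here.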

Evaluating model for: Fridge
Validation MAE: 20.281553 W
Validation MSE: 885.595398 W²
Validation RMSE: 29.758955 W
Signal Aggregate Error (SAE): 0.008629
Normalized Disaggregation Error (NDE): 0.515304
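The five metrics above can be computed directly from the predicted and ground-truth appliance power series. The formulas below are the definitions commonly used in the NILM literature (SAE as the relative error of total energy, NDE as squared error normalized by the true signal's energy); the log does not show the exact formulas used, so treat this as an assumption:

```python
import numpy as np

def nilm_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """MAE/MSE/RMSE plus the NILM-specific SAE and NDE metrics."""
    err = y_pred - y_true
    mae = np.mean(np.abs(err))                              # watts
    mse = np.mean(err ** 2)                                 # watts squared
    rmse = np.sqrt(mse)                                     # watts
    sae = np.abs(y_pred.sum() - y_true.sum()) / y_true.sum()  # relative energy error
    nde = np.sum(err ** 2) / np.sum(y_true ** 2)              # normalized squared error
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "SAE": sae, "NDE": nde}

# Toy example with four time steps of fridge power (W)
y_true = np.array([100.0, 0.0, 120.0, 110.0])
y_pred = np.array([90.0, 5.0, 130.0, 100.0])
print(nilm_metrics(y_true, y_pred))
```

The combination seen in the log (very low SAE, but NDE around 0.5) is typical of a fridge model that gets the total energy right while smoothing over the exact on/off timing.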
[Figure: Training and Validation Loss plot (interactive plot omitted)]